377 research outputs found

    Quantum-Dot Cascade Laser: Proposal for an Ultra-Low-Threshold Semiconductor Laser

    We propose a quantum-dot version of the quantum-well cascade laser of Faist et al. [Science 264, 553 (1994)]. The elimination of single-phonon decays by the three-dimensional confinement implies a reduction of several orders of magnitude in the threshold current. The requirements on dot size (10-20 nm) and on dot density and uniformity [one coupled pair of dots per (180 nm)^3 with 5% nonuniformity] are close to current technology.
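    As a quick back-of-the-envelope check (not part of the paper itself), the quoted requirement of one coupled dot pair per (180 nm)^3 corresponds to a pair density of roughly 1.7 × 10^14 cm^-3:

```python
# Back-of-the-envelope check (not from the paper): convert the quoted requirement of
# one coupled dot pair per (180 nm)^3 into a volumetric pair density.

cell_edge_nm = 180.0                           # cube edge quoted in the abstract
cell_volume_cm3 = (cell_edge_nm * 1e-7) ** 3   # 1 nm = 1e-7 cm

pairs_per_cm3 = 1.0 / cell_volume_cm3
print(f"Required pair density: {pairs_per_cm3:.2e} pairs/cm^3")
# -> Required pair density: 1.71e+14 pairs/cm^3
```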

    Design of Three-Dimensional, Path Length Matched Optical Waveguides

    A method for designing physically path length matched, three-dimensional photonic circuits is described. These waveguides, with arbitrary endpoints, were fabricated via the femtosecond laser direct-write technique. The focus is specifically on the case where every waveguide is uniquely routed from input to output, a problem which has not been addressed to date and which allows the waveguides to be used in interferometric measurements. Two iterative design methods were created for path length matched waveguides with adequate separation in three dimensions and minimized curvature. These algorithms could be used to calculate the predicted radius of curvature and the bend and transition losses in the waveguides, with results confirmed by computer simulation. Demonstrations via interferometric methods show that the fabricated circuits were indeed optically path length matched to within 45 μm, which is well within the coherence length required for typical applications, including astronomical measurements.
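    As an illustration of the kind of calculation involved (a minimal sketch, not the authors' algorithm), the physical path length of a three-dimensional route can be estimated by sampling a parametric curve, here a hypothetical cubic Bezier with invented control points, and summing segment lengths; an iterative design loop would then perturb the route until the mismatch against the other waveguides falls below the tolerance:

```python
# Minimal sketch (not the authors' algorithm): estimate the physical path length of a
# 3D waveguide modelled as a cubic Bezier curve, and compare two routes with different
# endpoints against each other. All control points and units are invented.
import numpy as np

def bezier_points(p0, p1, p2, p3, n=2001):
    """Sample a cubic Bezier curve with 3D control points p0..p3."""
    t = np.linspace(0.0, 1.0, n)[:, None]
    return ((1 - t) ** 3 * p0 + 3 * (1 - t) ** 2 * t * p1
            + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * p3)

def path_length(points):
    """Arc length of a sampled 3D curve: sum of straight segment lengths."""
    return np.sum(np.linalg.norm(np.diff(points, axis=0), axis=1))

# Two hypothetical routes with different endpoints (units: mm).
route_a = bezier_points(np.array([0.0, 0.0, 0.0]), np.array([5.0, 0.5, 0.2]),
                        np.array([10.0, -0.5, 0.4]), np.array([15.0, 0.0, 0.6]))
route_b = bezier_points(np.array([0.0, 1.0, 0.0]), np.array([5.0, 1.8, 0.3]),
                        np.array([10.0, 0.2, 0.5]), np.array([15.0, 1.0, 0.6]))

la, lb = path_length(route_a), path_length(route_b)
print(f"route A: {la:.4f} mm, route B: {lb:.4f} mm, mismatch: {abs(la - lb) * 1e3:.1f} um")
# A design loop would adjust the control points of the shorter route until the
# mismatch falls below the required tolerance (e.g. 45 um).
```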

    Reliability and Uncertainty in Diffusion MRI Modelling

    Current diffusion MRI studies often utilise more complex models beyond the single-exponential decay model used as the clinical standard. As this thesis shows, however, two of these models, the biexponential and kurtosis models, suffer from mathematical ill-conditioning when used with regression algorithms, which can cause extreme bias and/or variance in the parameter estimates. Using simulated noisy measurements generated from a known ground truth, the magnitude of the bias and variance was shown to vary with the signal parameters as well as the SNR, and increasing the SNR did not reduce this uncertainty for all data. Parameter-estimate reliability could not be assessed from a single regression fit in all cases unless bootstrap resampling was performed, in which case measurements with high parameter-estimate uncertainty were successfully identified. Prior to data analysis, current studies may use information-criteria or cross-validation model selection methods to establish the best model for assessing a specific tissue condition. While the best selection method to use is currently unclear in the literature, when tested on simulated data in this thesis no model selection method performed more reliably than the others, and these methods were merely biased toward either simpler or more complex models. When a specific model was used to generate simulated noisy data, no model selection method selected this true model for all signals, and the ability of these methods to select the true model also varied depending on the true signal parameters. The results from these simulated-data analyses were applied to ex vivo data from excised prostate tissue, and both information-criteria measures and bootstrap sample distributions were able to identify image voxels whose parameter estimates had likely reliability issues. Removing these voxels from the analysis improved the sample variance of the parameter estimates.
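    The ill-conditioning and bootstrap ideas can be illustrated with a short sketch (simulated data and invented parameter values, not the thesis code): fit a biexponential diffusion decay with non-linear least squares, then bootstrap the residuals to see how widely the parameter estimates spread:

```python
# Illustrative sketch only (not the thesis code): fit a biexponential diffusion model
# to simulated noisy data and use residual bootstrap resampling to gauge how reliable
# the parameter estimates are. All parameter values and noise levels are invented.
import numpy as np
from scipy.optimize import curve_fit

def biexp(b, s0, f, d_fast, d_slow):
    """Biexponential decay: S(b) = S0*(f*exp(-b*Dfast) + (1-f)*exp(-b*Dslow))."""
    return s0 * (f * np.exp(-b * d_fast) + (1 - f) * np.exp(-b * d_slow))

rng = np.random.default_rng(0)
b_values = np.linspace(0, 3000, 16)             # s/mm^2
truth = (1.0, 0.7, 2.0e-3, 0.3e-3)              # S0, f, Dfast, Dslow (mm^2/s)
signal = biexp(b_values, *truth) + rng.normal(0, 0.02, b_values.size)

p0 = (1.0, 0.5, 1.5e-3, 0.5e-3)
popt, _ = curve_fit(biexp, b_values, signal, p0=p0, maxfev=20000)
residuals = signal - biexp(b_values, *popt)

# Residual bootstrap: refit on resampled residuals added back onto the fitted curve.
boot = []
for _ in range(200):
    resampled = biexp(b_values, *popt) + rng.choice(residuals, residuals.size, replace=True)
    try:
        pb, _ = curve_fit(biexp, b_values, resampled, p0=popt, maxfev=20000)
        boot.append(pb)
    except RuntimeError:
        pass  # non-converged refits are themselves a symptom of ill-conditioning

boot = np.array(boot)
for name, est, spread in zip(("S0", "f", "Dfast", "Dslow"), popt, boot.std(axis=0)):
    print(f"{name}: {est:.4g} +/- {spread:.2g} (bootstrap SD)")
```

    A wide bootstrap spread for Dfast or Dslow, despite a visually good fit, is the kind of hidden unreliability the abstract describes.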

    High-performance 3D waveguide architecture for astronomical pupil-remapping interferometry

    The detection and characterisation of extra-solar planets is a major theme driving modern astronomy, with the vast majority of such measurements being achieved by Doppler radial-velocity and transit observations. Another technique, direct imaging, can access a parameter space that complements these methods, and paves the way for future technologies capable of detailed characterisation of exoplanetary atmospheres and surfaces. However, achieving the required levels of performance with direct imaging, particularly from ground-based telescopes which must contend with the Earth's turbulent atmosphere, requires considerable sophistication in the instrument and detection strategy. Here we demonstrate a new generation of photonic pupil-remapping devices which build upon the interferometric framework developed for the Dragonfly instrument: a high-contrast waveguide-based device which recovers robust complex-visibility observables. New-generation Dragonfly devices overcome problems caused by interference from unguided light and low throughput, promising unprecedented on-sky performance. Closure-phase measurement scatter of only ~0.2° has been achieved, with waveguide throughputs of >70%. This translates to a maximum contrast-ratio sensitivity (between the host star and its orbiting planet) at 1 λ/D (1σ detection) of 5.3 × 10^-4 (when a conventional adaptive-optics (AO) system is used) or 1.8 × 10^-4 (for typical 'extreme-AO' performance), improving even further when random error is minimised by averaging over multiple exposures. This is an order of magnitude beyond conventional pupil-segmenting interferometry techniques (such as aperture masking), allowing a previously inaccessible part of the star-to-planet contrast-separation parameter space to be explored.
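    As a rough order-of-magnitude illustration only (not the Dragonfly calibration pipeline), a companion's closure-phase signal is of order its contrast ratio expressed in radians, so the scatter quoted above can be converted into an approximate detection limit that improves as more independent closure phases are averaged; the number of closure triangles used below is purely illustrative:

```python
# Rough order-of-magnitude sketch (not the Dragonfly pipeline): relate closure-phase
# scatter to an approximate companion contrast limit, assuming the companion imprints
# a closure-phase signal of order its contrast ratio in radians, and that averaging
# over N independent closure phases reduces the noise as sqrt(N).
import numpy as np

def contrast_limit(sigma_cp_deg, n_closure_phases=1, n_sigma=1.0):
    """Approximate n-sigma detectable contrast ratio given closure-phase scatter (deg)."""
    return n_sigma * np.deg2rad(sigma_cp_deg) / np.sqrt(n_closure_phases)

print(f"{contrast_limit(0.2):.1e}")      # single closure phase: ~3.5e-3
print(f"{contrast_limit(0.2, 35):.1e}")  # with e.g. 35 closure triangles: ~5.9e-4
```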

    Pulse radiolysis of liquid water using picosecond electron pulses produced by a table-top terawatt laser system

    A laser-based electron generator is shown, for the first time, to produce sufficient charge to conduct time-resolved investigations of radiation-induced chemical events. Electron pulses generated by focussing terawatt laser pulses into a supersonic helium gas jet are used to ionize liquid water. The decay of the hydrated electrons produced by the ionizing electron pulses is monitored with 0.3 μs time resolution. Hydrated electron concentrations as high as 22 μM were generated. The results show that terawatt lasers offer both an alternative to linear accelerators and a means to achieve subpicosecond time resolution for pulse radiolysis studies. © 2000 American Institute of Physics.

    Developing a keystroke biometric system for continual authentication of computer users

    Data windows of keyboard input are analyzed to continually authenticate computer users and verify that they are the authorized ones. Because the focus is on fast intruder detection, the authentication process operates on short bursts of roughly a minute of keystroke input, while the training process can be extensive and use hours of input. The biometric system consists of components for data capture, feature extraction, authentication classification, and receiver-operating-characteristic (ROC) curve generation. Using keystroke data from 120 users, system performance was obtained as a function of two independent variables: the user population size and the number of keystrokes per sample. For each population size, the performance increased (and the equal error rate decreased) roughly logarithmically as the number of keystrokes per sample was increased. The best closed-system performance results of 99 percent on 14 participants and 96 percent on 30 participants indicate the potential of this approach.
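    A minimal sketch of the two core ingredients described above, timing-feature extraction from a window of key events and equal-error-rate estimation from score distributions, is given below with synthetic data (illustrative only, not the authors' system):

```python
# Minimal sketch (not the authors' system): extract simple keystroke-timing features
# (dwell and flight times) from a window of key events, and estimate an equal error
# rate (EER) from genuine and impostor score samples. All data here are synthetic.
import numpy as np

def timing_features(events):
    """events: list of (key, press_time, release_time). Returns mean dwell and flight times."""
    dwell = [release - press for _, press, release in events]
    flight = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
    return np.array([np.mean(dwell), np.mean(flight)])

def equal_error_rate(genuine_scores, impostor_scores):
    """EER: point in a threshold sweep where false-accept and false-reject rates cross."""
    thresholds = np.sort(np.concatenate([genuine_scores, impostor_scores]))
    far = np.array([np.mean(impostor_scores >= t) for t in thresholds])  # impostors accepted
    frr = np.array([np.mean(genuine_scores < t) for t in thresholds])    # genuine rejected
    i = np.argmin(np.abs(far - frr))
    return (far[i] + frr[i]) / 2

# A window of synthetic key events: (key, press_time_s, release_time_s).
window = [("t", 0.00, 0.09), ("h", 0.15, 0.23), ("e", 0.31, 0.38)]
print("features (mean dwell, mean flight):", timing_features(window))

rng = np.random.default_rng(1)
genuine = rng.normal(0.8, 0.1, 500)    # similarity scores for the authorized user
impostor = rng.normal(0.5, 0.15, 500)  # similarity scores for other users
print(f"EER ~ {equal_error_rate(genuine, impostor):.3f}")
```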